A Latent Variational Framework for Stochastic Optimization
This paper provides a unifying theoretical framework for stochastic optimization algorithms by means of a latent stochastic variational problem. Using techniques from stochastic control, the solution to the variational problem is shown to be equivalent to that of a Forward Backward Stochastic Differential Equation (FBSDE). By solving these equations, we recover a variety of existing adaptive stochastic gradient descent methods. This framework establishes a direct connection between stochastic optimization algorithms and a secondary latent inference problem on gradients, where a prior measure on gradient observations determines the resulting algorithm.
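For intuition on how discretizing continuous-time optimality dynamics can recover a familiar stochastic gradient method, here is a minimal Python sketch. The damped second-order dynamics, toy quadratic objective, and step sizes below are illustrative assumptions of ours, not the paper's actual FBSDE or prior measure; the point is only that a forward-Euler step of such dynamics with noisy gradients yields SGD with heavy-ball momentum.

# Illustrative sketch (not the paper's FBSDE): forward-Euler discretization of
# damped second-order dynamics  dX = V dt,  dV = -(a V + grad f(X)) dt,
# driven by a noisy gradient oracle. The resulting update is SGD with
# heavy-ball momentum.
import numpy as np

rng = np.random.default_rng(0)

def noisy_grad(x, noise_std=0.1):
    # Gradient of the toy objective f(x) = 0.5 * ||x||^2, plus Gaussian noise
    # standing in for stochastic (mini-batch) gradient observations.
    return x + noise_std * rng.standard_normal(x.shape)

def momentum_sgd_from_dynamics(x0, a=5.0, dt=0.05, steps=200):
    # One Euler step of dX = V dt, dV = -(a V + grad f(X)) dt gives
    #   v <- (1 - a*dt) * v - dt * g   and   x <- x + dt * v,
    # i.e. heavy-ball momentum with damping factor (1 - a*dt) and
    # effective step size dt^2.
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(steps):
        g = noisy_grad(x)
        v = (1.0 - a * dt) * v - dt * g
        x = x + dt * v
    return x

x_final = momentum_sgd_from_dynamics(np.array([2.0, -1.5]))
print("final iterate:", x_final)  # close to the minimizer at the origin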
Reviews: A Latent Variational Framework for Stochastic Optimization
This paper studies a variational theoretical framework for stochastic optimization. In particular, the authors show that finding the minimizer of a stochastic optimization problem is equivalent to solving a variational problem over a latent function space, which in turn is equivalent to solving a forward-backward SDE. They also show how to recover several popular stochastic optimization algorithms by discretizing the optimality equations defined by the SDE. However, this part lacks clarity and would benefit from further discussion of the theoretical behavior of the discretized algorithms. Overall, the paper is well written and has strong results.
This paper presents a latent variational framework for designing stochastic optimization algorithms using ideas from stochastic control. The main contribution of the paper is an action functional whose Euler-Lagrange (EL) equations give rise to a system of forward-backward stochastic differential equations (FBSDEs). These equations generalize the ODEs for deterministic optimization obtained by Wibisono et al. (2016). The paper also presents an analysis of the rate of convergence. The reviewers are uniformly positive about this work, and the authors' response has addressed most of their concerns.